Invited talk: Dependency Representations, Grammars, Folded Structures, among Other Things!
Author: Aravind K. Joshi, Department of Computer and Information Science and Institute for Research in Cognitive Science, University of Pennsylvania, Philadelphia, PA, USA ([email protected])
Abstract
In a dependency grammar (DG), dependency representations (trees) directly express the dependency relations between words. The hierarchical structure emerges out of the representation; there are no labels other than the words themselves. In a phrase-structure type of representation, words are associated with category labels, and the dependencies between the words then emerge indirectly in terms of the phrase structure, the nonterminal labels, and possibly some indices associated with the labels. Behind the scenes there is a phrase structure grammar (PSG) that builds the hierarchical structure. In a categorial grammar (CG), words are associated with labels that encode the combinatory potential of each word; the hierarchical structure (tree structure) then emerges out of a set of operations such as application, function composition, and type raising, among others. In a tree-adjoining grammar (TAG), each word is associated with an elementary tree that encodes both the hierarchical and the dependency structure associated with the lexical anchor and with the tree(s) associated with a word. The elementary trees are then composed with the operations of substitution and adjoining. In a way, the dependency potential of a word is localized within the elementary tree(s) associated with that word. TAG and TAG-like grammars are already able to represent dependencies that go beyond those that can be represented by context-free grammars, but in a controlled way.

With this perspective, and with the availability of larger dependency-annotated corpora (e.g., the Prague Dependency Treebank), one is able to assess how far one can cover the dependencies that actually appear in the corpora. This approach has the potential of carrying out an 'empirical' investigation of the power of representations and the associated grammars. Here, by 'empirical' we do not mean 'statistical or distributional', but rather covering as much as possible of the actual data in annotated corpora.

If time permits, I will talk about how dependencies are represented in nature. For example, grammars have been used to describe the folded structure of RNA biomolecules. The folded structure here describes the dependencies between the nucleotides as they appear in an RNA biomolecule. One can then ask the question: can we represent a sentence structure as a folded structure, where the fold captures both the dependencies and the structure, without any additional labels?

* Part of this work is in cooperation with Joan Chen-Main, University of Pennsylvania, Philadelphia, PA, and Johns Hopkins University, Baltimore, MD.
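The two ideas the abstract connects, hierarchical structure emerging from bare word-to-word links and dependencies viewed as a non-crossing "fold" in the spirit of RNA secondary structure, can be made concrete with a small sketch. The example below is purely illustrative and not from the talk: the sample sentence, the head-index encoding, and the helper names subtree and arcs_nest are assumptions of this sketch. It shows a label-free dependency representation, the tree recovered from it, and a check that its arcs nest without crossing (the kind of dependency pattern a context-free fold can capture, as opposed to the crossing dependencies that TAG-like grammars handle).

# Toy sketch (illustrative, not the speaker's code): a dependency representation
# given only as words plus head indices, with the hierarchy and the "fold"
# derived from those links alone.

def subtree(heads, i):
    """Recover the hierarchical structure rooted at word i as a nested tuple.
    heads[j] is the index of word j's head, or -1 for the root."""
    children = [j for j, h in enumerate(heads) if h == i]
    return (i, [subtree(heads, j) for j in children])

def arcs_nest(heads):
    """True if no two dependency arcs cross, i.e. the arcs can be drawn as a
    non-crossing fold, analogous to nested base pairs in an RNA secondary structure."""
    arcs = [(min(i, h), max(i, h)) for i, h in enumerate(heads) if h != -1]
    for (a, b) in arcs:
        for (c, d) in arcs:
            if a < c < b < d:   # the two arcs interleave, so they cross
                return False
    return True

if __name__ == "__main__":
    # "the cat chased a mouse": dependencies are stated purely as word-to-word links.
    words = ["the", "cat", "chased", "a", "mouse"]
    heads = [1, 2, -1, 4, 2]      # the->cat, cat->chased, chased = root, a->mouse, mouse->chased
    root = heads.index(-1)
    print(subtree(heads, root))   # hierarchical structure emerges from the links alone
    print("non-crossing fold:", arcs_nest(heads))

Running the sketch prints the tree (2, [(1, [(0, [])]), (4, [(3, [])])]) and confirms that these arcs nest; a sentence with crossing dependencies would fail the check, which is where formalisms beyond context-free power, such as TAG, become relevant.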
Publication date: 2013